DSC 140B
Problems tagged with categorical cross-entropy

Problem #172

Tags: multiple outputs, lecture-16, categorical cross-entropy

A multi-class classifier has 4 output nodes with softmax activation. The true label is \(\vec y = (0, 0, 1, 0)\) and the softmax outputs are \(\vec h = (0.1, 0.2, 0.6, 0.1)\).

Compute the categorical cross-entropy loss. Leave your answer in terms of \(\log\).

Solution

\(-\log(0.6)\).

By the categorical cross-entropy formula:

\[\ell(\vec h, \vec y) = -\sum_{k=1}^{4}\begin{cases} \log h_k, & \text{if } y_k = 1 \\ 0, & \text{if } y_k = 0 \end{cases}\]

Only \(y_3 = 1\) contributes, so the loss is \(-\log(h_3) = -\log(0.6)\).
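Because \(\vec y\) is one-hot, the piecewise sum above is equivalent to \(-\sum_k y_k \log h_k\). A minimal Python sketch checking the answer numerically (variable names are illustrative, not from the problem):

```python
import math

# Softmax outputs and one-hot label from the problem above.
h = [0.1, 0.2, 0.6, 0.1]
y = [0, 0, 1, 0]

# Categorical cross-entropy: only the entry where y_k = 1 contributes,
# so the sum reduces to -log(h_3).
loss = -sum(y_k * math.log(h_k) for h_k, y_k in zip(h, y))
print(loss)  # equals -log(0.6)
```

Here `loss` agrees with \(-\log(0.6) \approx 0.51\), matching the answer above.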

Problem #173

Tags: multiple outputs, lecture-16, categorical cross-entropy

A multi-class classifier has 3 output nodes with softmax activation. The true label is \(\vec y = (0, 1, 0)\) and the softmax outputs are \(\vec h = (0.3, 0.5, 0.2)\).

Compute the categorical cross-entropy loss. Leave your answer in terms of \(\log\).

Solution

\(-\log(0.5) = \log 2\).

By the categorical cross-entropy formula:

\[\ell(\vec h, \vec y) = -\sum_{k=1}^{3}\begin{cases} \log h_k, & \text{if } y_k = 1 \\ 0, & \text{if } y_k = 0 \end{cases}\]

Only \(y_2 = 1\) contributes, so the loss is \(-\log(h_2) = -\log(0.5)\); since \(0.5 = 1/2\), this simplifies to \(-\log(1/2) = \log 2\).

Problem #174

Tags: multiple outputs, lecture-16, categorical cross-entropy

A multi-class classifier has 4 output nodes with softmax activation. The true label is \(\vec y = (1, 0, 0, 0)\) and the softmax outputs are \(\vec h = (0.4, 0.3, 0.2, 0.1)\).

Compute the categorical cross-entropy loss. Leave your answer in terms of \(\log\).

Solution

\(-\log(0.4)\).

By the categorical cross-entropy formula:

\[\ell(\vec h, \vec y) = -\sum_{k=1}^{4}\begin{cases} \log h_k, & \text{if } y_k = 1 \\ 0, & \text{if } y_k = 0 \end{cases}\]

Only \(y_1 = 1\) contributes, so the loss is \(-\log(h_1) = -\log(0.4)\).